
    Information entropy and nucleon correlations in nuclei

    The information entropies in coordinate and momentum space and their sum ($S_r$, $S_k$, $S$) are evaluated for many nuclei using "experimental" densities and/or momentum distributions. The results are compared with the harmonic oscillator model and with short-range correlated distributions. It is found that $S_r$ depends strongly on $\ln A$ and does not depend very much on the model; the behaviour of $S_k$ is the opposite. The various cases we consider can be classified either by the quantity of the experimental data we use or by the values of $S$, i.e., an increase in the quality of the density and momentum distributions leads to an increase in the value of $S$. In all cases, apart from the linear relation $S = a + b\ln A$, the linear relation $S = a_V + b_V \ln V$ also holds, where $V$ is the mean volume of the nucleus. If $S$ is considered as an ensemble entropy, a relation between $A$ or $V$ and the ensemble volume can be found. Finally, comparing different electron scattering experiments for the same nucleus, it is found that the larger the momentum transfer range, the larger the information entropy. It is concluded that $S$ could be used to compare different experiments for the same nucleus and to choose the most reliable one. Comment: 14 pages, 4 figures, 2 tables
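
    For reference, the coordinate-space and momentum-space information entropies discussed above are, in this line of work, the Shannon functionals of the density and momentum distributions (assumed here to be normalised to unity); a minimal statement of the definitions is:

```latex
% Coordinate- and momentum-space information entropies (distributions normalised to 1)
S_r = -\int \rho(\mathbf{r})\,\ln\rho(\mathbf{r})\,d^3r ,\qquad
S_k = -\int n(\mathbf{k})\,\ln n(\mathbf{k})\,d^3k ,\qquad
S = S_r + S_k .
```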

    Maximum-Entropy Principle in Flexible Manufacturing Systems

    It is shown that the entropy of the joint probability distribution of the queue lengths of the M machine-groups in a closed queuing network model of a flexible manufacturing system is maximal when the loads on the different machine-groups are equal, both for single-machine machine-groups and for multiple-machine machine-groups whose group sizes are equal. It is also shown that for unequal machine-group sizes the entropy is not maximal when the workload is balanced. The simultaneous variations of the entropy function of the load distribution, the entropy function of the joint probability distribution of queue lengths, and the expected production function are studied in order to investigate the relationship between the information content and the productive capacity of manufacturing systems. Four measures of load balance in a flexible manufacturing system are given.
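
    The paper itself gives analytical results; purely as an illustrative sketch, and not the authors' derivation, the snippet below enumerates the joint queue-length distribution of a closed network of single-machine groups, assuming the standard product-form (Gordon-Newell) solution, computes its entropy, and compares a balanced with an unbalanced load vector.

```python
import itertools
import math

def joint_entropy(loads, n_jobs):
    """Entropy of the joint queue-length distribution of a closed network of
    single-machine groups with relative loads `loads` and `n_jobs` circulating
    jobs, assuming the product-form (Gordon-Newell) solution."""
    m = len(loads)
    # Enumerate all states (n_1, ..., n_M) with n_1 + ... + n_M = n_jobs.
    states = [s for s in itertools.product(range(n_jobs + 1), repeat=m)
              if sum(s) == n_jobs]
    weights = [math.prod(x ** n for x, n in zip(loads, s)) for s in states]
    g = sum(weights)                      # normalisation constant
    probs = [w / g for w in weights]
    return -sum(p * math.log(p) for p in probs if p > 0)

# Balanced versus unbalanced loads for M = 3 machine-groups and 6 jobs.
print(joint_entropy([1.0, 1.0, 1.0], 6))   # balanced loads  -> larger entropy
print(joint_entropy([2.0, 1.0, 0.5], 6))   # unbalanced loads -> smaller entropy
```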

    Entropy, Optimization and Counting

    In this paper we study the problem of computing max-entropy distributions over a discrete set of objects subject to observed marginals. Interest in such distributions arises due to their applicability in areas such as statistical physics, economics, biology, information theory, machine learning, combinatorics and, more recently, approximation algorithms. A key difficulty in computing max-entropy distributions has been to show that they have polynomially-sized descriptions. We show that such descriptions exist under general conditions. Subsequently, we show how algorithms for (approximately) counting the underlying discrete set can be translated into efficient algorithms to (approximately) compute max-entropy distributions. In the reverse direction, we show how access to algorithms that compute max-entropy distributions can be used to count, which establishes an equivalence between counting and computing max-entropy distributions
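
    For orientation, a max-entropy distribution subject to marginal constraints is an exponential family whose parameters solve a convex dual problem. The sketch below is a toy illustration only, not the paper's algorithm (which works with implicitly represented, exponentially large families accessed through counting oracles): it fits such a distribution over a small, explicitly enumerated family using scipy, with all names and numbers made up for the example.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

# Toy discrete family: each object is a 0/1 indicator vector over 4 "elements".
objects = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
], dtype=float)

# Observed marginals to match (chosen to lie inside the marginal polytope).
target_marginals = np.array([0.6, 0.5, 0.4, 0.5])

def dual(lam):
    # Log-partition function minus <lam, marginals>: convex in lam.
    return logsumexp(objects @ lam) - lam @ target_marginals

res = minimize(dual, np.zeros(4), method="BFGS")

# Max-entropy distribution: p(S) proportional to exp(<lam, indicator of S>).
scores = objects @ res.x
p = np.exp(scores - scores.max())
p /= p.sum()

print("max-entropy distribution:", p)
print("achieved marginals:", p @ objects)   # approximately target_marginals
```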

    Information of order α and type β

    Information $I_\alpha^\beta(Q/P)$ of order $\alpha$ and type $\beta$ is introduced and it is shown that for every fixed $\beta$, this information is a monotonic increasing function of $\alpha$. It is also shown that information of order $\alpha$ and type 1 is non-negative when $\sum_k q_k \ge \sum_k p_k$, where $(q_1, q_2, \ldots, q_N)$ and $(p_1, p_2, \ldots, p_N)$ are generalised probability distributions for $Q$ and $P$ respectively.
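
    The paper's exact order-α, type-β functional is not reproduced here. Purely for orientation, and as an assumption on my part about what the type-1 special case is compared against, Rényi's information gain of order α for generalised (possibly incomplete) distributions reads:

```latex
% Renyi information gain of order \alpha for generalised (possibly incomplete)
% distributions Q = (q_1, ..., q_N) and P = (p_1, ..., p_N):
I_\alpha(Q/P) = \frac{1}{\alpha - 1}\,
  \log \frac{\sum_{k=1}^{N} q_k^{\alpha}\, p_k^{\,1-\alpha}}{\sum_{k=1}^{N} q_k},
\qquad \alpha \neq 1 .
```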

    Experimental and numerical modelling of aerated flows over stepped spillways

    Stepped spillways are a popular design choice for reservoir overflows due to the high rates of energy dissipation and air entrainment compared to smooth spillways. Air entrainment is important in spillway flows as it affects the pressures acting on the spillway surface, which in adverse conditions can damage the spillway. Air entrainment also causes flow bulking, which increases the depth of flow. This study presents free surface and pressure data for aerated flows over an experimental stepped spillway, with pressures measured at different positions across the width of the channel. Within the step cavities, recirculating vortices are observed in both the stream-wise and cross-stream directions, with the direction of circulation alternating at each subsequent step. These 3D effects cause the pressures acting on the step edges to vary across the width of the channel. The Volume of Fluid (VOF) and Eulerian multiphase numerical models are used to predict flows over the spillway. The Eulerian multiphase model shows high levels of air entrainment and is able to predict the position of the free surface with reasonable accuracy. The VOF model, conversely, does not show any air entrainment and therefore under-predicts the position of the free surface. The accuracy with which each numerical model predicts pressures on the step faces varies depending on the measurement location. Both of the numerical models accurately simulate the direction of circulation of the 3D vortices within the step cavities. Simulations with varying channel widths, conducted using the VOF model, show that the pattern of 3D vortices repeats as the channel width is increased.

    Systematic Review and Meta-Analysis of Brief Cognitive Instruments to Evaluate Suspected Dementia in Chinese-Speaking Populations

    Background: Chinese is the most commonly spoken world language; however, most cognitive tests were developed and validated in the West. It is essential to find out which tests are valid and practical for Chinese-speaking people with suspected dementia. Objective: We therefore conducted a systematic review and meta-analysis of brief cognitive tests adapted for Chinese-speaking populations in people presenting for assessment of suspected dementia. Methods: We searched electronic databases for studies reporting the sensitivity and specificity of brief (≤20 minutes) cognitive tests used as part of dementia diagnosis for Chinese-speaking populations in clinical settings. We assessed quality using Centre for Evidence Based Medicine (CEBM) criteria, and translation and cultural adaptation using the Manchester Translation Reporting Questionnaire (MTRQ) and the Manchester Cultural Adaptation Reporting Questionnaire (MCAR). We assessed heterogeneity and combined sensitivity in meta-analyses. Results: 38 studies met inclusion criteria and 22 were included in meta-analyses. None met the highest CEBM criteria. Five studies met the highest criteria of MTRQ and MCAR. In meta-analyses of studies with acceptable heterogeneity (I² < 75%), the Addenbrooke's Cognitive Examination Revised/III (ACE-R/ACE-III) had the best sensitivity and specificity; specifically, for dementia (93.5%, 85.6%) and for mild cognitive impairment (81.4%, 76.7%). Conclusions: Current evidence is that the ACE-R and ACE-III are the best brief cognitive assessments for dementia and mild cognitive impairment in Chinese-speaking populations. They may reduce the time taken to reach a diagnosis, allowing people to access interventions and plan for the future.
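
    The pooling step behind the I² threshold quoted above can be illustrated generically (the paper's exact meta-analytic model is not specified here). The sketch below computes Cochran's Q, I², and a fixed-effect pooled sensitivity on the logit scale from hypothetical study-level counts.

```python
import numpy as np

# Hypothetical study-level data: true positives and false negatives per study.
tp = np.array([45, 60, 38, 52])
fn = np.array([5, 10, 7, 4])

# Logit-transformed sensitivities and their approximate (delta-method) variances.
sens = tp / (tp + fn)
y = np.log(sens / (1 - sens))
var = 1.0 / tp + 1.0 / fn
w = 1.0 / var

# Cochran's Q and the I^2 heterogeneity statistic.
y_bar = np.sum(w * y) / np.sum(w)          # fixed-effect pooled logit sensitivity
Q = np.sum(w * (y - y_bar) ** 2)
df = len(y) - 1
I2 = max(0.0, (Q - df) / Q) * 100.0

pooled_sens = 1.0 / (1.0 + np.exp(-y_bar))
print(f"pooled sensitivity ~ {pooled_sens:.3f}, I^2 = {I2:.1f}%")
```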

    Consistent Application of Maximum Entropy to Quantum-Monte-Carlo Data

    Bayesian statistics within the framework of the maximum-entropy concept has been widely used for inferential problems, in particular to infer dynamic properties of strongly correlated fermion systems from Quantum Monte Carlo (QMC) imaginary-time data. In current applications, however, a consistent treatment of the error covariance of the QMC data is missing. Here we present a closed Bayesian approach that accounts consistently for the QMC data. Comment: 13 pages, RevTeX, 2 uuencoded PostScript figures
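
    For context, the usual maximum-entropy setup for inferring a spectral function from QMC imaginary-time data, which this abstract builds on, can be summarised as follows (kernel notation assumed); the paper's point is that the covariance matrix C entering χ² must be treated consistently within the Bayesian framework:

```latex
% QMC data \bar{G}, kernel K, spectral function A, default model m, covariance C:
\bar{G}(\tau_i) = \int d\omega\, K(\tau_i,\omega)\, A(\omega), \qquad
P(A \mid \bar{G}) \propto \exp\!\left( \alpha S[A] - \tfrac{1}{2}\chi^2[A] \right),
\qquad
\chi^2[A] = \sum_{i,j} \bigl[\bar{G}_i - (KA)_i\bigr]\, (C^{-1})_{ij}\,
            \bigl[\bar{G}_j - (KA)_j\bigr],
```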

    Computationally Efficient Implementation of Convolution-based Locally Adaptive Binarization Techniques

    One of the most important steps of document image processing is binarization. The computational requirements of locally adaptive binarization techniques make them unsuitable for devices with limited computing facilities. In this paper, we present a computationally efficient implementation of convolution-based locally adaptive binarization techniques while keeping the performance comparable to the original implementation. The computational complexity has been reduced from O(W²N²) to O(WN²), where W×W is the window size and N×N is the image size. Experiments over benchmark datasets show that the computation time is reduced by 5 to 15 times depending on the window size, while memory consumption remains the same with respect to the state-of-the-art algorithmic implementation.
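
    The stated reduction from O(W²N²) to O(WN²) is what one obtains by replacing the full W×W window convolution with two 1-D passes. As an illustrative sketch, not the authors' exact implementation, the snippet below computes a Niblack-style local threshold from separable box filters so that each pixel costs O(W) rather than O(W²).

```python
import numpy as np

def box_1d(img, w, axis):
    """Length-w 1-D mean filter along `axis` (edge padding): O(W) work per pixel."""
    out = np.zeros_like(img, dtype=np.float64)
    padded = np.pad(img, [(w // 2, w // 2) if ax == axis else (0, 0)
                          for ax in range(img.ndim)], mode="edge")
    for shift in range(w):   # W shifted copies -> O(W N^2) total per pass
        out += np.take(padded, np.arange(shift, shift + img.shape[axis]), axis=axis)
    return out / w

def niblack_binarize(img, w=25, k=-0.2):
    """Niblack local thresholding T = mean + k*std via separable box filters."""
    img = img.astype(np.float64)
    mean = box_1d(box_1d(img, w, 0), w, 1)          # local mean over w x w window
    mean_sq = box_1d(box_1d(img ** 2, w, 0), w, 1)  # local mean of squares
    std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
    threshold = mean + k * std
    return (img > threshold).astype(np.uint8) * 255
```

    Integral-image variants would remove the W factor entirely; the separable form is shown only because it matches the O(WN²) bound quoted in the abstract.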

    Towards the Formal Reliability Analysis of Oil and Gas Pipelines

    It is customary to assess the reliability of underground oil and gas pipelines in the presence of excessive loading and corrosion effects to ensure a leak-free transport of hazardous materials. The main idea behind this reliability analysis is to model the given pipeline system as a Reliability Block Diagram (RBD) of segments such that the reliability of an individual pipeline segment can be represented by a random variable. Traditionally, computer simulation is used to perform this reliability analysis, but it provides approximate results and requires an enormous amount of CPU time to attain reasonable estimates. Due to its approximate nature, simulation is not very suitable for analyzing safety-critical systems like oil and gas pipelines, where even minor analysis flaws may result in catastrophic consequences. As an accurate alternative, we propose to use a higher-order-logic theorem prover (HOL) for the reliability analysis of pipelines. As a first step towards this idea, this paper provides a higher-order-logic formalization of reliability and the series RBD using the HOL theorem prover. For illustration, we present the formal analysis of a simple pipeline that can be modeled as a series RBD of segments with exponentially distributed failure times. Comment: 15 pages
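
    For reference, the series reliability-block-diagram result that the formalisation targets is the standard one: a pipeline of n independent segments fails as soon as any segment fails, so with exponentially distributed segment failure times

```latex
% Series RBD of n independent segments, segment i with failure rate \lambda_i:
R_{\mathrm{series}}(t) = \prod_{i=1}^{n} R_i(t)
                       = \prod_{i=1}^{n} e^{-\lambda_i t}
                       = \exp\!\Big(-t \sum_{i=1}^{n} \lambda_i\Big),
\qquad t \ge 0 .
```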